Validation Master Plans (VMP) - discuss validation activities across an entire site or within an organization. The Validation Master Plan is a summary of validation strategy. The purpose of the Validation Master Plan is to document the compliance requirements for the site and to ensure that sufficient resources are available for validation projects.
Sometimes Validation Master Plans are written to cover specific departmental validation activities or the validation process for a specific type of system (for example, all programmable logic controllers (PLCs) within a manufacturing process). These master plans describe the specific validation process for that group or system type. Master plans are written to assist an organization with validation strategy or to provide control over a specific process.
The Validation Master Plan is different from a validation procedure (SOP), which describes the specific process for performing validation activities. When plans are written specifically for a single validation project, they are referred to as Validation Plans. Sometimes master plans are named for their functional area, such as a Site Validation Master Plan, Pharmaceutical Validation Master Plan, or Software Master Plan.
The Validation Master Plan includes:
- Systems, equipment, methods, facilities, etc., that are in the scope of the plan
- Current validation status for the systems within the project scope
- Compliance requirements for validation, including how the validated state will be maintained
- Schedule of validation activities
Validation Master Plans can also include:
- Required validation deliverables
- Validation documentation format
- Current validation procedures and policies
- General validation risk mitigation strategy
Validation Master Plans should be approved by the head of Site Quality, plus other senior department heads as appropriate. Senior management approval is necessary for Validation Master Plans because their support is essential for the success of the plan.
Validation Plans (VP) - define the scope and goals of a validation project. The Validation Plan is written at the start of the validation project (sometimes concurrently with the user requirement specification) and is usually specific to a single validation project.
The collection of documents produced during a validation project is called a Validation Package. Once the validation project is complete, all documents in the validation package should be stored according to your site document control procedures.
Validation Plans are different from Validation Master Plans. Validation Plans are usually project-specific; Validation Master Plans govern validation activities for an entire organization or site. Sometimes plans are also named for the applicable subject area, such as a Software Validation Plan.
A Validation Plan should include:
- Deliverables (documents) to be generated during the validation process
- Resources, departments, and personnel to participate in the validation project
- Timelines for completing the validation project
- Acceptance criteria to confirm that the system meets defined requirements
- Compliance requirements for the system, including how the system will meet these requirements
The plan should be written with a level of detail that reflects the complexity of the system.
The plans should be approved, at a minimum, by the System Owner and Quality Assurance. Once approved, the plan should be retained according to your site document control procedures.
Risk Assessment (RA) - documents potential business and compliance risks associated with a system and the strategies that will be used to mitigate those risks. Risk Assessments justify the allocation of validation resources and can streamline the testing process. They also serve as a forum for users, developers, system owners, and Quality to discuss the system, which can have other intangible benefits. 21 CFR 11 does not require risk assessments, but Annex 11 does require a risk-management strategy.
Assigning risk should be a multi-disciplinary function. System owners, key end-users, system developers, information technology, engineers, and Quality should all participate if they are involved with the system. The Risk Assessment should be signed by the personnel who participated in the assessment.
There are many methods for Risk Assessment, but they generally all include rating the risk of each requirement in at least three categories (a scoring sketch follows this list):
- Criticality – How important a function is to the system. Low criticality means that the system can continue to function relatively normally even if the function is completely compromised; high criticality means that if the function is damaged, one of the primary functions of the system cannot be accomplished.
- Detectability – The ease of detecting an issue arising with a particular function. A low chance of detection corresponds to higher risk; a high chance of detection corresponds to lower risk.
- Probability – The probability of an issue arising with a particular function. Low probability means there is little chance that the function will fail; high probability means there is a high chance that the function will fail.
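To illustrate how the three ratings are often combined, the sketch below multiplies them into a simple risk priority score. The 1–3 scale, the treatment of detectability as "chance the issue goes undetected," and the high/medium/low thresholds are assumptions made for this example, not requirements from any regulation or standard.

```python
# Minimal sketch of FMEA-style risk scoring for requirements.
# Assumptions: each category is rated 1 (low) to 3 (high); detectability is
# scored as "chance the issue goes undetected," so a higher score = higher risk.

def risk_priority(criticality: int, probability: int, undetectability: int) -> int:
    """Return a simple risk priority number (1-27) for one requirement."""
    for rating in (criticality, probability, undetectability):
        if rating not in (1, 2, 3):
            raise ValueError("ratings must be 1, 2, or 3")
    return criticality * probability * undetectability

# Example requirements with invented IDs and ratings.
requirements = {
    "REQ-001 Audit trail records all changes": (3, 2, 3),
    "REQ-014 Report header shows site logo":   (1, 1, 1),
}

for req_id, ratings in requirements.items():
    score = risk_priority(*ratings)
    level = "high" if score >= 12 else "medium" if score >= 6 else "low"
    print(f"{req_id}: score {score} ({level} risk)")
```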
User Requirement Document or Specification (URD or URS) - describes what the business and its users need from the system. User Requirements Specifications are written early in the validation process, typically before the system is created. They are written by the system owner and end-users, with input from Quality Assurance. Requirements outlined in the URS are usually tested in the Performance Qualification or User Acceptance Testing. The URS is not intended to be a technical document; readers with only a general knowledge of the system should be able to understand the requirements it outlines.
The URS is generally a planning document, created when a business is planning on acquiring a system and is trying to determine specific needs. When a system has already been created or acquired, or for less complex systems, the user requirement specification can be combined with the functional requirements document.
Good requirements are objective and testable. For example:
- Screen A accepts production information, including Lot, Product Number, and Date.
- System B produces the Lab Summary Report.
- Twenty users can use System C concurrently without noticeable system delays.
- Screen D can print on-screen data to the printer.
- System E will be compliant with 21 CFR 11.
The URS should include:
- Introduction – including the scope of the system, key objectives for the project, and the applicable regulatory concerns
- Program Requirements – the functions and workflow that the system must be able to perform
- Data Requirements – the type of information that a system must be able to process
- Life Cycle Requirements – including how the system will be maintained and how users will be trained
For more examples and templates, see the User Requirements Specification Template.
Requirements are usually provided with a unique identifier, such as an ID#, to aid in traceability throughout the validation process.
User Requirements Specifications should be signed by the system owner, key end-users, and Quality. Once approved, the URS is retained according to your organization’s practices for document retention.
Functional Requirements, Functional Requirement Specifications, Functional Specs (FRS, FS) - document the operations and activities that a system must be able to perform.
Functional Requirements should include:
- Descriptions of data to be entered into the system
- Descriptions of operations performed by each screen
- Descriptions of work-flows performed by the system
- Descriptions of system reports or other outputs
- Who can enter the data into the system
- How the system meets applicable regulatory requirements
The Functional Requirements Specification is designed to be read by a general audience. Readers should understand the system, but no particular technical knowledge should be required to understand the document.
Functional requirements should include functions performed by specific screens, outlines of work-flows performed by the system, and other business or compliance requirements the system must meet. Download an example functional requirements specification or use these quick examples below.
Interface Requirements
- Field 1 accepts numeric data entry.
- Field 2 only accepts dates before the current date.
- Screen 1 can print on-screen data to the printer.
Business Requirements
- Data must be entered before a request can be approved.
- Clicking the Approve button moves the request to the Approval Workflow.
- All personnel using the system will be trained according to internal SOP AA-101.
Regulatory/Compliance Requirements
- The database will have a functional audit trail.
- The system will limit access to authorized users.
- The spreadsheet can secure data with electronic signatures.
Security Requirements (see the sketch after these examples)
- Members of the Data Entry group can enter requests but cannot approve or delete requests.
- Members of the Managers group can enter or approve a request but cannot delete requests.
- Members of the Administrators group cannot enter or approve requests but can delete requests.
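As an illustration only, the security requirement examples above could be captured in a simple role-permission table and later verified during testing. The group names come from the examples; the data structure and function below are assumptions made for this sketch.

```python
# Minimal sketch of the role-permission rules from the security requirements above.
PERMISSIONS = {
    "Data Entry":     {"enter"},
    "Managers":       {"enter", "approve"},
    "Administrators": {"delete"},
}

def is_allowed(group: str, action: str) -> bool:
    """Return True if members of the group may perform the action on a request."""
    return action in PERMISSIONS.get(group, set())

# Spot checks that mirror the stated requirements:
assert is_allowed("Data Entry", "enter") and not is_allowed("Data Entry", "approve")
assert is_allowed("Managers", "approve") and not is_allowed("Managers", "delete")
assert is_allowed("Administrators", "delete") and not is_allowed("Administrators", "enter")
```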
Depending on the system being described, different categories of requirements are appropriate. System Owners, Key End-Users, Developers, Engineers, and Quality Assurance should all participate in the requirement gathering process, as appropriate to the system.
Requirements outlined in the Functional Requirements Specification are usually tested in the Operational Qualification.
Additional Comments
The Functional Requirements Specification describes what the system must do; how the system does it is described in the Design Specification.
If a User Requirement Specification was written, all requirements outlined in the User Requirement Specification should be addressed in the Functional Requirements Specification.
The Functional Requirements Specification should be signed by the System Owner and Quality Assurance. If key end-users, developers, or engineers were involved with developing the requirements, it may be appropriate to have them sign and approve the document as well.
Depending on the size and complexity of the program, the Functional Requirements Specification document can be combined with either the user requirements specification or the design specification.
Design Specifications (DS) - describe how a system performs the requirements outlined in the Functional Requirements. Depending on the system, this can include instructions on testing specific requirements, configuration settings, or review of functions or code. All requirements outlined in the functional specification should be addressed; linking requirements between the functional requirements and design specification is performed via the Traceability Matrix.
Good requirements are objective and testable. Design Specifications may include (a brief calculation example follows this list):
- Specific inputs, including data types, to be entered into the system
- Calculations/code used to accomplish defined requirements
- Outputs generated from the system
- Technical measures to ensure system security
- How the system meets applicable regulatory requirements
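For example, where the Functional Requirements state that the system performs a calculation, the Design Specification might spell out the exact formula, inputs, and rounding. The snippet below is a hypothetical illustration of that level of detail; the calculation, parameter names, and rounding rule are invented for the example.

```python
# Hypothetical design-level detail for a calculation requirement:
# "The system calculates percent recovery from the measured and expected amounts."
def percent_recovery(measured_mg: float, expected_mg: float) -> float:
    """Return percent recovery, rounded to one decimal place.

    Inputs are amounts in milligrams; expected_mg must be greater than zero.
    """
    if expected_mg <= 0 or measured_mg < 0:
        raise ValueError("measured_mg must be >= 0 and expected_mg must be > 0")
    return round(100.0 * measured_mg / expected_mg, 1)
```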
System requirements and verification of the installation process are usually tested in the Installation Qualification. Input, processing, output, and security requirements are usually tested in the Operational Qualification.
Due to the extremely technical nature of most design documents, there is currently some discussion in the industry about who needs to review the Design Specification. The Design Specification is reviewed and approved, at minimum, by the System Owner, System Developer, and Quality Assurance. Quality Assurance signs to ensure that the document complies with appropriate regulations and that all requirements were successfully addressed, but they do not necessarily need to review technical information.
Depending on the size and complexity of the program, the design specification may be combined with the functional requirements document.
Test Plan / Test Protocol - In a validation project, Test Plans or Test Protocols are used to demonstrate that a system meets requirements previously established in specification, design, and configuration documents. Test Plans document the general testing strategy; Test Protocols are the actual testing documents. In many cases, the Test Plan and Test Protocol are combined into a single document.
The Test Plan outlines the testing requirements and strategy. It should include the general process for performing the testing, documenting evidence of testing, and handling testing failures. The Test Plan may also include the types of testing, descriptions of the environments where testing will be performed, who is responsible for testing, equipment or tools that will be used in testing, and other organizational requirements for testing.
Test Protocols describe the specific testing. Test Protocols are collections of Test Cases, each of which checks a specific element of the system. Each test case should include the purpose of the test, any prerequisites that must be met before testing, and the acceptance criteria for the test.
Each test case is made up of a series of test steps. Each step should include an instruction, an expected result, and the actual result. The instructions should include enough detail so that a tester can consistently perform the required testing activity. There should also be a place for the tester to assess whether each step passes or fails.
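As a rough sketch of how this structure can be represented (not a required format; the field names are assumptions for illustration), a test case and its steps might look like this:

```python
# Minimal sketch of a test case structure; field names are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestStep:
    instruction: str         # what the tester does
    expected_result: str     # what should happen
    actual_result: str = ""  # recorded during execution
    passed: Optional[bool] = None

@dataclass
class TestCase:
    case_id: str
    purpose: str
    prerequisites: List[str]
    acceptance_criteria: str
    steps: List[TestStep] = field(default_factory=list)

    def deviations(self) -> List[TestStep]:
        """Steps where the actual result did not match the expected result."""
        return [s for s in self.steps if s.passed is False]
```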
The process of following the instructions and recording the results is called “executing” the protocol. When executing test protocols, the tester should follow established Good Documentation Practices. This includes either using a compliant computer system to record the testing results or documenting the results on paper in pen. Any discrepancy between the expected result and the actual result should be tracked as a deviation. Deviations should be resolved before validation is complete.
Software validation usually uses three specific testing protocols:
- Installation Qualifications (IQ) verify that systems are on machines suited to run the software, that the system has been properly installed and that the configuration is correct. These requirements are outlined in the Design Specification.
- Operational Qualifications (OQ) verify that systems perform as expected. The OQ tests requirements outlined in the Functional Requirements.
- Performance Qualifications (PQ) verify that systems perform tasks in real-world conditions. The PQ tests requirements outlined in the User Requirement Specification.
Engineering Validations sometimes use two additional testing protocols:
- Factory Acceptance Test (FAT) – Factory acceptance tests verify that the equipment meets requirements outlined in the User Requirement Specification or Functional Requirements. FATs are performed at the point of assembly. Customers will often ask to be present for the FAT, though the tests are usually performed by the manufacturer. Many companies do not allow the manufacturer to ship the item until it passes the factory acceptance test, and some contractual payments are dependent upon the item passing the FAT.
- User Acceptance Test (UAT) or Site Acceptance Test (SAT) – User and site acceptance tests verify that the item performs as required by the User Requirement Specification or Functional Requirements. Once an item passes UAT/SAT, it is ready for use, unless other contractual arrangements are made between the user and the vendor.
Test Protocols should be approved before protocol execution. A copy of the unexecuted protocol should be kept in the validation package. The unexecuted protocol should be approved by the System Owner and Quality Assurance. The executed protocol should be signed by the tester and reviewed by the system owner and Quality.
Installation Qualification (IQ) Protocol verifies the proper installation and configuration of a System. This can include ensuring that necessary files have been loaded, equipment has been installed, the necessary procedures have been approved, or the appropriate personnel have been trained. The requirements to properly install the system were defined in the Design Specification. Installation Qualification must be performed before completing the Operational Qualification or Performance Qualification.
Depending on your needs and the complexity of the system, Installation Qualification can be combined with Operational Qualification or Performance Qualification.
Installation Qualification protocols should be approved before protocol execution. A copy of the unexecuted protocol should be kept in the validation package. The unexecuted protocol should be approved by the System Owner and Quality Assurance. The executed protocol should be signed by the tester and reviewed by the system owner and Quality.
Installation Qualification might test (a scripted example follows this list):
- That the host machine has the appropriate operating system, processor, RAM, etc.
- That all files required to run the system are present
- That all documentation required to train system personnel has been approved
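For instance, the first two checks above could be scripted. This is a minimal sketch: the file paths, the 8 GB RAM threshold, and the Linux-specific memory lookup are assumptions for the example, not requirements.

```python
# Minimal sketch of scripted IQ checks: required files present and minimum RAM.
import os

REQUIRED_FILES = ["/opt/app/app.bin", "/opt/app/config.ini"]  # example paths
MIN_RAM_GB = 8  # example threshold

def check_files() -> bool:
    """Verify that all files required to run the system are present."""
    return all(os.path.exists(path) for path in REQUIRED_FILES)

def check_ram() -> bool:
    """Verify total physical memory (Linux-specific lookup)."""
    total_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    return total_bytes >= MIN_RAM_GB * 1024**3

print("Required files present:", "PASS" if check_files() else "FAIL")
print(f"At least {MIN_RAM_GB} GB RAM:", "PASS" if check_ram() else "FAIL")
```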
Each step of the qualification should include an instruction, an expected result, and the actual result. Any discrepancy between the expected result and the actual result should be tracked as a deviation. Deviations should be resolved before validation is complete.
For more examples, see our installation qualification template.
Operational Qualification (OQ) Protocol is a collection of test cases used to verify the proper functioning of a system. The operational qualification test requirements are defined in the Functional Requirements Specification. Operational Qualification is usually performed before the system is released for use.
Depending on your needs and the complexity of the system, Operational Qualification can be combined with Installation Qualification or Performance Qualification.
Operational Qualifications should be approved before protocol execution. A copy of the unexecuted protocol should be kept in the validation package. The unexecuted protocol should be approved by the System Owner and Quality Assurance. The executed protocol should be signed by the tester and reviewed by the system owner and Quality.
For example, the operational qualification might test:
- That each screen accepts the appropriate data
- That an item can be moved through an entire workflow
- That system security has been properly implemented
- That all technological controls for compliance with 21 CFR 11 are functioning as expected
Each step of the qualification should include an instruction, an expected result, and the actual result. Any discrepancy between the expected result and the actual result should be tracked as a deviation. Deviations should be resolved before validation is complete.
Performance Qualification (PQ) - a collection of test cases used to verify that a system performs as expected under simulated real-world conditions. The performance qualification tests requirements defined in the User Requirements Specification (or possibly the Functional Requirements Specification). Sometimes the performance qualification is performed by power users as the system is being released.
Depending on your needs and the complexity of the system, Performance Qualification can be combined with Installation Qualification or Operational Qualification.
Performance Qualifications should be approved before protocol execution. A copy of the unexecuted protocol should be kept in the validation package. The unexecuted protocol should be approved by the System Owner and Quality Assurance. The executed protocol should be signed by the tester and reviewed by the system owner and Quality.
For example, a performance qualification might demonstrate:
- That a system can handle multiple users without significant system lag
- That when the system contains large quantities of data, queries are returned in a certain (short) period of time
- That concurrent independent work-flows do not affect each other
- That a laboratory test correctly identifies a known material
- That a process was completed within defined system requirements
Each step of the qualification should include an instruction, an expected result, and the actual result. Any discrepancy between the expected result and the actual result should be tracked as a deviation. Deviations should be resolved before validation is complete.
Requirements Traceability Matrix (Trace Matrix, RTM, TM) - a document that links requirements throughout the validation process. The purpose of the Requirements Traceability Matrix is to ensure that all requirements defined for a system are tested in the test protocols. The traceability matrix is a tool both for the validation team, to ensure that requirements are not lost during the validation project, and for auditors, to review the validation documentation.
The requirements traceability matrix is usually developed concurrently with the initial list of requirements (either the User Requirements Specification or Functional Requirements Specification). As the Design Specifications and Test Protocols are developed, the traceability matrix is updated to reference those documents. Ideally, requirements should be traced to the specific test step in the testing protocol in which they are tested.
The traceability matrix can either reference the requirement identifiers (unique numbers for each requirement) or the actual requirement itself. In the example shown below, requirements are traced between a Functional Requirements Specification, Design Specification, and Operational Qualification.
| Functional Requirements | Design Specifications | Test Cases |
| --- | --- | --- |
| The program will have a functional audit trail. | Each form will use fxn_Audit_Trail in the OnUpdate event procedure. | OQ, Test Case 3, Step 52: Audit Trail Verification |
In more complicated systems, the traceability matrix may include references to additional documentation, including user requirements, risk assessments, etc.
The traceability matrix can be created and maintained in an automated tool, an Excel spreadsheet, or an MS Word table.
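If the matrix is maintained as structured data (for example, exported from a spreadsheet), gap checking can be automated. The sketch below uses invented requirement IDs and simply flags any requirement that has no linked test step:

```python
# Minimal sketch of a requirements traceability gap check; IDs are invented examples.
trace_matrix = {
    "FRS-01": {"design": "DS-04", "tests": ["OQ TC3 Step 52"]},               # audit trail
    "FRS-02": {"design": "DS-07", "tests": ["OQ TC1 Step 10", "PQ TC2 Step 5"]},
    "FRS-03": {"design": "DS-09", "tests": []},                               # not yet traced
}

untested = [req for req, links in trace_matrix.items() if not links["tests"]]
if untested:
    print("Requirements with no linked test step:", ", ".join(untested))
else:
    print("All requirements trace to at least one test step.")
```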
Protocol Test Deviations - When the actual results of a test step in a Test Protocol do not match the expected results, this is called a Deviation.
Deviation reports should include:
- Description – How the actual results differ from the expected results
- Root Cause – What caused the deviation
- Corrective Action – What changes were made to the testing protocol or the system to correct the deviation
Deviations should be reviewed and the solution approved by the System Owner and Quality Assurance. Deviations do not necessarily need to be separate documents, but a system should be in place to ensure that all deviations are addressed before a validation project is resolved. An organization’s control over their deviation process is often reflective of their Quality organization as a whole; thus, regulatory auditors will often focus on the deviation process.
Validation Summary Report (Validation Report, Summary Report, VR, SR) - provides an overview of the entire validation project. Once the summary report is signed, the validation project is considered complete. When regulatory auditors review validation projects, they typically begin by reviewing the summary report.
When validation projects use multiple test protocols, some organizations produce a testing summary report for each protocol and then summarize the project with a final Summary Report.
The amount of detail in the reports should reflect the relative complexity, business use, and regulatory risk of the system. The report is often structured to mirror the validation plan that initiated the project.
The report is reviewed, approved, and signed by the System Owner and Quality Assurance. Once the validation project is complete, the validation package, including the summary report, should be stored according to your site document control procedures.
The validation summary report should include:
- A description of the validation project, including the project scope
- All test cases performed, including whether those test cases passed without issue
- All deviations reported, including how those deviations were resolved
- A statement whether the system met the defined requirements
Change Control for Validated Systems - Change Control is a general term describing the process of managing how changes are introduced into a controlled System. Change control demonstrates to regulatory authorities that validated systems remain under control during and after system changes. Change Control systems are a favorite target of regulatory auditors because they vividly demonstrate an organization’s capacity to control its systems.
Organizations need to explicitly define their processes for evaluating changes to validated systems. There should be a well-defined, multidisciplinary approach to considering the effects of proposed changes. Some changes, such as adding a data field to a form or report, may be very minor; other changes, such as altering how a program stores and organizes data, can be quite extensive. Before changes are implemented, organizations should document the expected outcomes of the changes and have an established plan to implement and test the change and to update any existing validation documentation. The process for evaluating changes should also define the requirements for implementing minor, major, and critical changes, which allows the organization to apply validation resources in proportion to the change effort.
One useful tool for determining the extent of revalidation is Risk Assessment. By reviewing the original validation requirements and evaluating the new risks introduced by the changes to the system, the Risk Assessment process can help determine which sections of the system need re-testing. If the risk assessment determines that the change is minor or does not affect the system requirements, only limited testing, focused on the affected system object, is required to demonstrate that the system has maintained its validated state. Major changes require additional re-validation, and critical changes can trigger an entire re-validation of the system.
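Purely as an illustration, the classification-to-revalidation mapping might be made explicit as shown below; the category names and the scope assigned to each are example policy choices, not regulatory requirements.

```python
# Minimal sketch mapping an assessed change classification to a revalidation scope.
REVALIDATION_SCOPE = {
    "minor":    "Targeted testing of the affected function only",
    "major":    "Re-execute affected OQ/PQ test cases and update specifications",
    "critical": "Full revalidation of the system",
}

def revalidation_scope(classification: str) -> str:
    """Return the agreed revalidation scope for a given change classification."""
    try:
        return REVALIDATION_SCOPE[classification]
    except KeyError:
        raise ValueError(f"unknown change classification: {classification!r}")

print(revalidation_scope("minor"))
```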
Typical Steps in a Change Control project are:
- Request the Change – The System Owner formally requests a change to the system.
- Assess the Impact of the Change – Before the change is made, the system owner and other key stakeholders, including Quality, determine how the change will affect the system.
- System Development in a Safe Environment – Changes should initially be made away from the validated system. For computer systems, this can mean testing in a sandbox environment. For equipment, process, or method validations, this usually means implementing the change during a period when manufacturing has shut down.
- System Testing/Re-Validation – Before changes are accepted, the system is validated to ensure system accuracy, reliability and consistent intended performance.
- Implementation of the Change – The changed system is released to the site and users are trained on changes to the system. For computer systems, this means pushing the changes out to general users. For equipment, process or method validation, this means introducing the system into the larger production process.
Validation Terminology
A list of common validation terminology.
Actual Result – What a system does when a particular action is performed
Deliverable – A tangible or intangible object produced as a result of project execution, as part of an obligation. In validation projects, deliverables are usually documents.
Deviation – When a system does not act as expected
End-User – A person who uses the validated system
Expected Result – What a system should do when a particular action is performed
Protocol – A collection of Test Cases, used to document the testing of a system
Qualification – A testing protocol which designates that a system meets a particular collection of requirements. An Installation Qualification ensures that a system has been properly installed. An Operational Qualification demonstrates that a system functions as expected in a controlled environment. A Performance Qualification verifies that a system works under real-life conditions.
Quality Assurance – Members of the organization who are tasked with ensuring the quality of materials produced at that organization. GxP organizations are required to have robust and independent Quality Assurance operations. Depending on the organization, this group may be titled Quality Control or Quality Organization; other organizations have multiple groups dedicated to quality with their own distinct missions.
Requirement – Something a system must be able to do
Retrospective Validation – Validation of an existing system. Retrospective validations are usually performed in response to a new need for a system to be compliant or an identified gap in GxP compliance.
Specification – A document outlining the requirements for a system. Specifications are usually sub-divided into User Requirements Specifications, Functional Requirements, and Design Specifications.
System – Object or process undergoing validation. In these pages, system is intended to be a generic term, meaning the computer system, equipment, method, or process to be validated.
System Owner – The individual who is ultimately responsible for a system
Test Case – A documented procedure, used to test that a system meets a particular requirement or collection of requirements
Test Plan – A general testing methodology established to ensure that a system meets requirements. A Test Plan can also refer to the collection of protocols or qualifications used to test and document that a system meets requirements.
Test Step – An individual line of a Test Case. Each Test Step should include instructions, an expected result, and an actual result.
Traceability – The ability to ensure that requirements outlined in the specifications have been tested. This is usually recorded in a Requirements Traceability Matrix.
Validation – A documented process, testing a system to demonstrate and ensure its accuracy, reliability, and consistent intended performance
Validation Package – A collection of documents produced during a validation project
Common Validation Acronyms
CC – Change Control
DS – Design Specification
FAT – Factory Acceptance Testing
FS – Functional Specification
FRS – Functional Requirement Specification (See Functional Specification)
GCP – Good Clinical Practice, a collection of quality guidelines for clinical operations
GLP – Good Laboratory Practice, a collection of quality guidelines for pharmaceutical laboratory operations
GMP – Good Manufacturing Practice, a collection of quality guidelines for pharmaceutical manufacturing operations
GxP – An abbreviation combining GCP, GLP, and GMP. Sometimes also called cGxP, Current Good Practices
IQ – Installation Qualification
IOPQ – Installation/Operational/Performance Qualification
IOQ – Installation/Operational Qualification
PQ – Performance Qualification
OPQ – Operational/Performance Qualification
OQ – Operational Qualification
RTM – Requirement Traceability Matrix
SAT – Site Acceptance Testing
SDS – Software Design Specification (See Design Specification)
Spec – Specification
TM – Traceability Matrix
UAT – User Acceptance Testing
URS – User Requirement Specification
VMP – Validation Master Plan
VP – Validation Plan
Validation FAQ (Frequently Asked Questions about Validation)
Q: When do I need to validate my systems?
A: Validation is required when your system (computer system, equipment, process, or method) is used in a GxP process or used to make decisions about the quality of the product. In addition, if the system is used to generate information for submissions to regulatory bodies like the FDA, the system needs to be validated.
Q: How does validation add value to my system?
A: Validation adds value by demonstrating that the system will perform as expected. Validation also reduces the risk of regulatory non-compliance.
Q: Do I need to validate my computer system?
A: Computer system validation is required for systems used to store electronic records, according to FDA 21 CFR Part 11.10(a) and Annex 11 Paragraph 4.
Q: Where do I find the rules for validating pharmaceutical manufacturing processes and equipment?
A: Guidelines for validation for pharmaceutical manufacturing are in FDA 21 CFR 211.
Q: What federal rules are in place regulating Quality Systems?
A: Quality System regulation is located in FDA 21 CFR 820.
Q: Why are there so many documents?
A: Proper documentation is required to demonstrate that the system was tested, including validation planning, protocol execution, and quality review. From a regulatory auditor’s point of view, if you don’t document what you did, you didn’t do it.
Q: Am I allowed to change a validated system?
A: Changing validated systems requires Change Control to ensure that there are no unexpected or unrecorded changes to the system.
Q: What is GAMP?
A: GAMP is an acronym for Good Automated Manufacturing Practices. GAMP contains a collection of industry best practices for validation.
Q: What is ICH?
A: ICH is an acronym for the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use. ICH is a collaboration of regulatory authorities from the United States, Europe, Japan, and members of the pharmaceutical industry. ICH also issues industry best practices for validation.
Q: Do you do validations?
A: Yes. Ofni Systems is an industry-recognized leader in computer validation.
Q: Do you have tools to facilitate our validation process?
A: Yes. The FastVal Validation Document Generator can improve the quality of your validation documentation and help you complete validation projects in 70% less time than traditional validation methods.
[src: http://www.ofnisystems.com/services/validation/validation-master-plans/]